45 research outputs found

    A Machine Learning Concept for DTN Routing

    This paper discusses the concept and architecture of a machine learning-based router for delay-tolerant space networks. Reinforcement learning and Bayesian learning are used to supplement the routing decisions of the popular Contact Graph Routing algorithm. An introduction to the concepts of Contact Graph Routing, Q-routing and Naive Bayes classification is given. The development of an architecture for a cross-layer feedback framework for DTN (Delay-Tolerant Networking) protocols is discussed. Finally, the initial simulation setup and results are presented.
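The Q-routing technique the abstract names is commonly formulated as the Boyan–Littman update, in which a node refines its estimated delivery delay via a neighbor after each forwarding. The sketch below illustrates that update rule only; the node names, delay values and learning rate are hypothetical placeholders, not values from the paper.

```python
ALPHA = 0.5  # learning rate (illustrative value)

# Q[node][(destination, neighbor)] = estimated delivery delay via that neighbor
Q = {
    "x": {("d", "y"): 10.0, ("d", "z"): 14.0},
    "y": {("d", "u"): 6.0, ("d", "v"): 9.0},
}

def q_route_update(Q, node, dest, neighbor, queue_delay, tx_time):
    """After forwarding a bundle from `node` to `neighbor`, move the old
    estimate toward queue delay + transmission time + the neighbor's best
    remaining-delay estimate (the classic Q-routing target)."""
    best_from_neighbor = min(
        est for (d, _), est in Q[neighbor].items() if d == dest
    )
    old = Q[node][(dest, neighbor)]
    target = queue_delay + tx_time + best_from_neighbor
    Q[node][(dest, neighbor)] = old + ALPHA * (target - old)
    return Q[node][(dest, neighbor)]

# Node x forwarded a bundle to y; y's best estimate toward d is 6.0,
# so the target is 1.0 + 2.0 + 6.0 = 9.0 and x's estimate drops toward it.
new_q = q_route_update(Q, "x", "d", "y", queue_delay=1.0, tx_time=2.0)
```

In a CGR-supplemented router, such learned delay estimates would inform, rather than replace, the contact-plan-based route choice.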

    Evaluation of Classifier Complexity for Delay Tolerant Network Routing

    The growing popularity of small, cost-effective satellites (SmallSats, CubeSats, etc.) creates the potential for a variety of new science applications involving multiple nodes functioning together or independently to achieve a task, such as swarms and constellations. As this technology develops and is deployed for missions in Low Earth Orbit and beyond, the use of delay-tolerant networking (DTN) techniques may improve communication capabilities within the network. In this paper, a network hierarchy is developed from heterogeneous networks of SmallSats, surface vehicles, relay satellites and ground stations, which form an integrated network. As the number of nodes in the network increases, there is a tradeoff between the complexity, flexibility, and scalability of user-defined schedules versus autonomous routing. To address these issues, this work proposes a machine learning classifier based on DTN routing metrics. A framework is developed that allows several categories of machine learning algorithms (decision tree, random forest and deep learning) to be applied to a dataset of historical network statistics, so that the tradeoff between algorithm complexity and performance can be explored. We develop an emulation of a hierarchical network consisting of tens of nodes, which form a cognitive network architecture. CORE (Common Open Research Emulator) is used to emulate the network using the Bundle Protocol and DTN IP neighbor discovery.
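The complexity-versus-performance evaluation described above might be set up roughly as follows. This is a sketch only: synthetic data stands in for the historical network statistics, the feature count and model settings are made up, and scikit-learn is an assumed toolkit, not one named by the paper.

```python
# Compare classifiers of increasing complexity on a stand-in dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for historical network statistics
# (e.g. queue depth, contact duration, link rate -- hypothetical features).
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=50, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    acc = model.score(X_te, y_te)
    print(f"{name}: accuracy={acc:.2f}")
```

The same loop extends naturally to deeper models, letting accuracy be weighed against training and inference cost on resource-constrained nodes.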

    VITALAS at TRECVID-2008

    In this paper, we present our experiments on the high-level feature extraction task at TRECVID 2008. As this is our first year of participation in TRECVID, our system adopts several popular approaches proposed by other groups in previous years. We propose two advanced low-level features: a new Gabor texture descriptor and a Compact-SIFT codeword histogram. Our system applies the well-known LIBSVM library to train the SVM base classifiers. In the fusion step, methods such as voting, SVM-based fusion, HCRF and Bootstrap Average AdaBoost (BAAB) are employed.
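The simplest of the fusion methods listed, voting, can be sketched as a majority vote over the per-feature base-classifier decisions. The feature names and decisions below are invented placeholders; the real system fuses SVM outputs per concept.

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse binary concept decisions (1 = concept present) from several
    base classifiers by taking the most common decision."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical per-feature decisions for one video shot.
base_decisions = {"gabor": 1, "compact_sift": 1, "color_hist": 0}
fused = majority_vote(list(base_decisions.values()))
```

Weighted variants (e.g. weighting each base classifier by validation accuracy) are a common next step before moving to learned fusion such as SVM-based or boosted combiners.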

    Data distribution in a wireless environment with migrating nodes

    The introduction of mobile wireless devices brings unique challenges for the distribution of data to many devices simultaneously. An optimizing multicast methodology called Probabilistic Multicast Trees (PMT) is extended to handle mobile wireless devices. We show that the PMT multiple-tree multicast system is well suited to this mobile, dynamic environment.

    How can telemedicine improve the quality of care for patients with Alzheimer’s disease and related dementias? A narrative review

    Background and Objectives: Dementia affects more than 55 million patients worldwide, with a significant societal, economic, and psychological impact. However, many patients with Alzheimer’s disease (AD) and other related dementias have limited access to effective and individualized treatment. Care provision for dementia is often unequal, fragmented, and inefficient. The COVID-19 pandemic accelerated telemedicine use, which holds promising potential for addressing this important gap. In this narrative review, we aim to analyze and discuss how telemedicine can improve the quality of healthcare for AD and related dementias in a structured manner, based on the seven dimensions of healthcare quality defined by the World Health Organization (WHO, 2018): effectiveness, safety, people-centeredness, timeliness, equitability, integrated care, and efficiency. Materials and Methods: The MEDLINE and Scopus databases were searched for peer-reviewed articles investigating the role of telemedicine in the quality of care for patients with dementia. A narrative synthesis was based on the seven WHO dimensions. Results: Most studies indicate that telemedicine is a valuable tool for AD and related dementias: it can improve effectiveness (better access to specialized care, accurate diagnosis, evidence-based treatment, avoidance of preventable hospitalizations), timeliness (reduction of waiting times and unnecessary transportation), patient-centeredness (personalized care for needs and values), safety (appropriate treatment, reduction of infection risk), integrated care (interdisciplinary approach through several dementia-related services), efficiency (mainly cost-effectiveness) and equitability (overcoming geographical barriers and cultural diversity). However, digital illiteracy, legal and organizational issues, as well as limited awareness, are significant potential barriers.
Conclusions: Telemedicine may significantly improve all aspects of the quality of care for patients with dementia. However, future longitudinal studies with control groups, including participants across a wide educational spectrum, will deepen our understanding of the real impact of telemedicine on the quality of care for this population.

    Methods on LDL particle isolation, characterization, and component fractionation for the development of novel specific oxidized LDL status markers for atherosclerotic disease risk assessment

    The present study uses simple, innovative methods to isolate, characterize and fractionate LDL into its main components for the study of the specific oxidations on them that characterize oxidized low-density lipoprotein (oxLDL) status, as it causatively relates to risk assessment for atherosclerosis-associated cardiovascular disease (CVD). These methods are: (a) a simple, relatively short, low-cost protocol for LDL isolation, which avoids the shortcomings of the currently employed ultracentrifugation and affinity chromatography methodologies; (b) verification of LDL purity by apoB100 SDS-PAGE analysis and by LDL particle size determination; the latter and its serum concentration are determined in the present study by a simple method that is more clinically feasible as a marker of CVD risk assessment than nuclear magnetic resonance; (c) a protocol for fractionating LDL, for the first time, into its main protein/lipid components (apoB100, phospholipids, triglycerides, free cholesterol, and cholesteryl esters), as well as into its carotenoid/tocopherol content; and (d) protocols for measuring, for the first time, indicative specific oxidative modifications of LDL components (cholesteryl ester-OOH, triglyceride-OOH, free cholesterol-OOH, phospholipid-OOH, apoB100-MDA, and apoB100-DiTyr) out of the many (known, unknown, or under development) that collectively define oxLDL status, in contrast to the current non-specific methods of oxLDL status evaluation. The indicative oxLDL status markers, selected in the present study because they express early oxidative stress-induced effects on LDL, are studied for the first time in patients with end-stage kidney disease on maintenance hemodialysis, chosen as an indicative model for atherosclerosis-associated diseases.
Isolating LDL and fractionating its protein and main lipid components, as well as its antioxidant arsenal of carotenoids and tocopherols, paves the way for future studies to investigate all possible oxidative modifications responsible for turning LDL into oxLDL, in association with their possible escape from LDL’s internal antioxidant defense. This can lead to studies identifying those oxidative modifications of oxLDL (after their artificial generation on LDL) that are recognized by macrophages and convert them into foam cells, which are known to be responsible for the formation of the atherosclerotic plaques that lead to the various CVDs.

    Diagnostic strategy and timing of intervention in infected necrotizing pancreatitis: an international expert survey and case vignette study

    Background: The optimal diagnostic strategy and timing of intervention in infected necrotizing pancreatitis are subject to debate. We performed a survey on these topics amongst a group of international expert pancreatologists. Methods: An online survey including case vignettes was sent to 118 international pancreatologists. We evaluated the use and timing of fine needle aspiration (FNA), antibiotics, catheter drainage and (minimally invasive) necrosectomy. Results: The response rate was 74% (n = 87). None of the respondents use FNA routinely, 85% use it selectively and 15% never. Most respondents (87%) use a step-up approach in patients with infected necrosis. Walled-off necrosis (WON) is considered a prerequisite for endoscopic drainage by 66% and for percutaneous drainage by 12%. After diagnosing infected necrosis, 55% routinely postpone invasive interventions, whereas 45% proceed immediately to intervention. Lack of consensus about the timing of intervention was apparent on day 14 with proven infected necrosis (58% intervention vs. 42% non-invasive) as well as on day 20 with only clinically suspected infected necrosis (59% intervention vs. 41% non-invasive). Discussion: The step-up approach is the preferred treatment strategy for infected necrotizing pancreatitis amongst expert pancreatologists. There is no uniformity regarding the use of FNA and the timing of intervention in the first 2–3 weeks of infected necrotizing pancreatitis.

    Firmware engineering and microprogramming at the University of Cincinnati

    No full text